Basic Statistics on the Twitter Accounts

In this section, basic statistics are collected for the Twitter accounts of the given groups of libraries (i.e. national libraries, university libraries, and public libraries).

The functions return a list of dictionaries and save it as a CSV file to the cwd (current working directory).

The dictionaries have the following keys (an example record is sketched after this list):

  • 'created_at' ( = the Twitter Time Stamp),
  • 'created_at_sec' ( = the date in seconds since 1970-01-01),
  • 'days' (= the number of days since created_at),
  • 'days_since_last_tweet',
  • 'followers_count',
  • 'friends_count',
  • 'id_str' ( = the Twitter ID as a string),
  • 'location' ( = if a location is given in the description of the account),
  • 'screen_name' ( = the Twitter handle/username),
  • 'statuses_count' ( = number of Tweets),
  • 'tweets_per_day',
  • 'tweets_per_year'
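
For illustration, a single record might look like this (a sketch with values adapted from the report below; the timestamp and days_since_last_tweet are hypothetical):

{'created_at': 'Tue May 12 14:08:54 +0000 2009',
 'created_at_sec': 1242137334.0,
 'days': 1791,
 'days_since_last_tweet': 2,
 'followers_count': 1727,
 'friends_count': 95,
 'id_str': '39468408',
 'location': 'münchen',
 'screen_name': 'bsb_muenchen',
 'statuses_count': 1256,
 'tweets_per_day': 0.7,
 'tweets_per_year': 255.8}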

Finally, there is a Report section, which provides an overview. For each library group, the following is printed:

  • The number of libraries,
  • the median of the groups' Tweets per day,
  • the oldest and the latest library account on Twitter with their Tweets-per-day ratio,
  • a list of libraries that are no longer actively tweeting,
  • the libraries with the most and the fewest Tweets, and
  • a summary for each library.

Function definitions


In [1]:
# authenticating @ Twitter

# Function definition taken from Mining the Social Web, 2nd Ed.
# cf. https://github.com/ptwobrussell/Mining-the-Social-Web-2nd-Edition

'''
Go to http://dev.twitter.com/apps/new to create an app and get values
for these credentials, which you'll need to provide in place of these
empty string values that are defined as placeholders.
See https://dev.twitter.com/docs/auth/oauth for more information 
on Twitter's OAuth implementation.
'''

#importing libraries
import twitter

CONSUMER_KEY = ''
CONSUMER_SECRET = ''
OAUTH_TOKEN = ''
OAUTH_TOKEN_SECRET = ''

auth = twitter.oauth.OAuth(OAUTH_TOKEN, OAUTH_TOKEN_SECRET,
                           CONSUMER_KEY, CONSUMER_SECRET)
twitter_api = twitter.Twitter(auth=auth)
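
As a quick sanity check (a sketch, not part of the original workflow; the screen name is arbitrary), the credentials can be verified with a single API call:

# hypothetical smoke test: look up one well-known account
test = twitter_api.users.show(screen_name='twitter')
print test['screen_name'], 'found - authentication works.'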

In [2]:
#import & export CSV
import csv

def impCSV(input_file):
    '''
    input_file = csv with the keys "Ort", "URL", "Twitter"
    output = list of dictionaries
    '''
    f = open(input_file, 'r')
    d = csv.DictReader(f)
    LoD = []   # list of dictionaries
    for row in d:
        LoD.append(row)
    f.close()
    return LoD

def exp2CSV(listOfDict, filename):
    '''
    arguments = list of dictionaries, filename
    output = saves file to cwd (current working directory)
    '''
    #creating the filename of the csv with current datestamp 
    import datetime
    datestamp = datetime.datetime.now().strftime('%Y-%m-%d')    
    outputfile = filename[:-4] + '_' + datestamp + '.csv'
    keyz = listOfDict[0].keys()
    f = open(outputfile, 'wb')   # 'wb' avoids blank lines in the CSV on Windows (Python 2)
    dict_writer = csv.DictWriter(f, keyz)
    dict_writer.writeheader()   # write the header row
    dict_writer.writerows(listOfDict)
    f.close()
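
A minimal round trip of the two helpers might look like this (hypothetical file name and data; 'Ort' and 'Twitter' are the keys used by the mining functions below):

# hypothetical sample data
sample = [{'URL': 'http://example.org', 'Twitter': 'examplelib', 'Ort': 'berlin'}]
exp2CSV(sample, 'Sample.csv')   # writes e.g. Sample_2014-04-01.csv to the cwd
# impCSV('Sample_2014-04-01.csv') reads it back as a list of dictionaries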

In [3]:
###################################
#                                 #
#  Functions for the Data Mining  #
#                                 #
###################################


#importing libraries
import json                           #for pretty printing
import time                           #for calculating Tweets per day
import operator                       #for sorting dictionaries
from collections import Counter       #for turning lists to dictionaries etc.
from prettytable import PrettyTable   #for pretty printing in a table


# getting the ListOfScreenNames
def getLoSN(csvFile):
    '''
    input = csv filename of a list of dictionaries with the keys 'Ort' (location) and 'Twitter' (screen name)
    returns a list of tuples with t[0] = libLocation, t[1] = Twitter screenname
    '''
    LoD = impCSV(csvFile)
    ListOfScreenNamesLocationTuples = []
    for i in LoD:
        ListOfScreenNamesLocationTuples.append((i['Ort'], i['Twitter']))
    return ListOfScreenNamesLocationTuples
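# Example (hypothetical row {'Ort': 'Berlin', 'Twitter': 'sbb_news', ...}):
# getLoSN('NatBibTwitter.csv') -> [('Berlin', 'sbb_news'), ...]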



#getting basic info for a given account incl. last status update
# users.lookup = max. 100 requests per session! Not a problem in this section of the queries.
def AccountInfo(L):
    '''
    input = list of tuples with str of screen_names and location
    output = list of tuples with t[0] = libLocation, t[1] = lists of dictionaries
    '''
    outputList = []
    errorList = []   # collects screen_names that could not be looked up
    for n in L:
        try:
            search_results = twitter_api.users.lookup(screen_name=n[1])
            outputList.append((n[0], search_results))
        except twitter.api.TwitterHTTPError:   # e.g. a suspended or deleted account
            errorList.append(n[1])
    if errorList:
        print 'No data could be retrieved for:', ', '.join(errorList)
    return outputList
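# Example output shape (hypothetical values):
# [('berlin', [{'screen_name': 'sbb_news', 'followers_count': 1313, ...}]), ...]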



# getting some basic stats for the screen_names
def baseStats(AccountInfoList):
    '''
    input = return list from AccountInfo(L)
    output = list of dictionaries with the keys screen_name, id_str, location, followers_count,
             friends_count, statuses_count, created_at, created_at_sec, days,
             tweets_per_day, tweets_per_year, and days_since_last_tweet
    '''
    baseStatsList = []
    for e in range(len(AccountInfoList)):
        newDict = {}   #creating a new dictionary for each account
        screenName = AccountInfoList[e][1][0]['screen_name'].lower()  # cf. @ Notebook 3 - Twitter CSV files
        UserID = AccountInfoList[e][1][0]['id_str'].encode('utf-8')
        nrOfFollowers = AccountInfoList[e][1][0]['followers_count']   #How many Followers?
        nrOfFriends = AccountInfoList[e][1][0]['friends_count']   #How many Following/Friends?
        nrOfStatusUpdates = AccountInfoList[e][1][0]['statuses_count']
        tweetsSince = AccountInfoList[e][1][0]['created_at'].encode('utf-8')
        #date of the account's last Tweet:
        DateOfLastTweet = AccountInfoList[e][1][0]['status']['created_at'].encode('utf-8')
        
        #normalizing the location
        
        '''
        # This code is only necessary if the Twitter location is used instead of the DBS location
        # location = AccountInfoList[e][1][0]['location'].encode('utf-8')   #get the location (in case the screen_name isn't sufficient)
        # list of words to remove from the location's description (federal states & country)
        removeWords = ['Deutschland', 'Germany', 'Baden-Württemberg', 'Bayern', 'Brandenburg', 'Hessen', 'Mecklenburg-Vorpommern',
              'Niedersachsen', 'Nordrhein-Westfalen', 'Rheinland-Pfalz', 'Saarland', 'Sachsen',
              'Sachsen-Anhalt', 'Schleswig-Holstein', 'Thüringen'] #except 'Berlin', 'Bremen', 'Hamburg' (they double as city names)!

        #normalizing location (lowercase, stripping off Germany etc.) (e.g. "Oldenburg, Germany", "Hessen, Kassel")
        location = (location.replace(",", "")).lower()   #remove separator and normalize to lowercase
        for w in removeWords:   #remove federal state and/or country ('w' avoids shadowing the outer loop variable 'e')
            if w.lower() in location:
                location = location.replace(w.lower(), '')   #replace() removes the substring; strip() would remove single characters
                location = location.strip()   #strip off white space
        '''
        location = AccountInfoList[e][0].lower()
        idxLoc1 = location.find('/')          # strip off everything from '/' on to the right (e.g. 'Frankfurt/M')
        idxLoc2 = location.find('-')          # strip off everything from '-' on to the right (e.g. 'Duisburg-Essen')
        if idxLoc1 != -1:
            location = location[:idxLoc1]
        if idxLoc2 != -1:
            location = location[:idxLoc2]
        if 'sporths' in location:
            location = location.replace('sporths', '').strip()   # the lib of KölnSportHS has given that as their location!
         
        
        #calculating Tweets per day and year
        t0 = time.mktime(time.strptime(tweetsSince, "%a %b %d %H:%M:%S +0000 %Y"))   #returns date in seconds (from 1970-01-01)
        t1 = time.time() #returns current date in seconds (from 1970-01-01)
        diff = int(round((t1 - t0)/86400)) #calculates the difference in days (86400 sec per day)
        tweetsPerDay = round((float(nrOfStatusUpdates)/diff),2)   #returns nr of Tweets per day as a float
        diffYear = round((diff/365.0),2)
        tweetsPerYear = round((float(nrOfStatusUpdates)/diffYear),2)   #returns nr of Tweets per year as a float
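        # Worked example (hypothetical): 500 Tweets over 1000 days
        # -> tweetsPerDay = round(500/1000.0, 2) = 0.5
        # -> diffYear = round(1000/365.0, 2) = 2.74; tweetsPerYear = round(500/2.74, 2) = 182.48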
        
        #calculating time since last Tweet
        LastTweet_t0 = time.mktime(time.strptime(DateOfLastTweet, "%a %b %d %H:%M:%S +0000 %Y"))
        daysSinceLastTweet = int(round((t1 - LastTweet_t0)/86400))
        
        #writing to the dictionary
        newDict['screen_name'] = screenName
        newDict['id_str'] = UserID
        newDict['location'] = location
        newDict['followers_count'] = nrOfFollowers
        newDict['friends_count'] = nrOfFriends
        newDict['statuses_count'] = nrOfStatusUpdates
        newDict['created_at'] = tweetsSince
        newDict['created_at_sec'] = t0
        newDict['days'] = diff
        newDict['tweets_per_day'] = tweetsPerDay
        newDict['tweets_per_year'] = tweetsPerYear
        newDict['days_since_last_tweet'] = daysSinceLastTweet
        baseStatsList.append(newDict) #writing to the List
        
    return baseStatsList



########################################
#                                      #
#  Function for the reporting section  #
#                                      #
########################################


#return the median of Tweets per Day
def medianOfTPD(LoD):
    '''
    input = list of dicts with the key 'tweets_per_day'
    output = the median of the Tweets-per-day values
    '''
    l = []
    for e in LoD:
        l.append(e['tweets_per_day'])
    l.sort()
    if len(l)%2 != 0:
        median = l[len(l)/2]
    else:
        median = (l[len(l)/2-1] + l[len(l)/2])/2.0
    return median
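# Example: [0.1, 0.5, 0.9] -> 0.5 (odd length); [0.1, 0.5, 0.9, 1.3] -> (0.5 + 0.9)/2 = 0.7 (even length)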


#Sorting the Accounts based on created_at
def sortingDate(L):
    '''
    input = baseStats(StatusLists) or list of dicts
    output = sorted list of dicts from oldest to newest account
    '''
    l=L[:]
    l.sort(key=operator.itemgetter('created_at_sec'))
    return l


#Sorting the Accounts based on days_since_last_tweet
def sortingDateOfLastTweet(L):
    '''
    input = baseStats(StatusLists) or list of dicts
    output = sorted list of dicts from most recently active to longest inactive
    '''
    l=L[:]
    l.sort(key=operator.itemgetter('days_since_last_tweet'))
    return l

#get the inactive accounts (i.e. accounts without a Tweet in the last 100 days)
def getInactiveAccounts(LoD):
    '''
    input = list of dicts with the key 'days_since_last_tweet'
    output = prints the screen_names of the accounts considered inactive
    '''
    l = []
    for e in LoD:
        if e['days_since_last_tweet'] > 100:
            l.append(e['screen_name'])
    if len(l) == 0:
        print 'There is no inactive library in this group. (I.e. all libraries have tweeted in the last 100 days.)'
    elif len(l) == 1:
        print l[0], "hasn't tweeted in the last 100 days. This library can be considered inactive on Twitter." 
    else:
        s = ", ".join(l)
        print s, "haven't tweeted in the last 100 days. These libraries can be considered inactive on Twitter." 


        
#Sorting the Accounts based on number of Tweets
def sortingTweets(L):
    '''
    input = baseStats(StatusLists) or list of dicts
    output = sorted list of dicts from lousiest Tweeter to SocialMedia Addict
    '''
    li = L[:]
    li.sort(key=operator.itemgetter('statuses_count'))   #operator is already imported above
    return li
        
#Printing a summary sorted by date
def printSummary(dictList):
    '''a small function to print a summary of the list of dicts sorted by date'''
    for e in range(len(dictList)):
        print dictList[e]['location'], ':', dictList[e]['screen_name'], '= UserID:', dictList[e]['id_str']
        print '--> Followers:', dictList[e]['followers_count'], '; Following:', dictList[e]['friends_count'], '; Tweets:', dictList[e]['statuses_count']
        print '--> Tweets since:', dictList[e]['created_at'][4:7], dictList[e]['created_at'][-4:], '=', dictList[e]['days'], 'days', '; Tweets per day:', dictList[e]['tweets_per_day']
        print

Requesting Data

1. National Libraries


In [4]:
# 1: get the list of screennames
# ==> insert csv-name  !!
NatBib_libList = getLoSN('NatBibTwitter.csv')
print len(NatBib_libList), 'libraries were queried.'

# 2: get the account information for each screenname
NatBib_accountInfoList = AccountInfo(NatBib_libList)

# 3: get some basic stats and write them to a list of dictionaries
NatBib_baseStatsList = baseStats(NatBib_accountInfoList)

# 4: save this LoD as a csv to the cwd
# ==> insert csv-name  !!
exp2CSV(NatBib_baseStatsList, 'NatBib_BasicStats.csv')
print 'The findings were saved as a CSV file to your cwd as NatBib_BasicStats_[current datestamp].csv.'


3 libraries were queried.
The findings were saved as a CSV file to your cwd as NatBib_BasicStats_[current datestamp].csv.

2. University Libraries


In [5]:
# 1: get the list of screennames
# ==> insert csv-name  !!
UniBib_libList = getLoSN('UniBibTwitter.csv')
print len(UniBib_libList), 'libraries were queried.'

# 2: get the account information for each screenname
UniBib_accountInfoList = AccountInfo(UniBib_libList)

# 3: get some basic stats and write them to a list of dictionaries
UniBib_baseStatsList = baseStats(UniBib_accountInfoList)

# 4: save this LoD as a csv to the cwd
# ==> insert csv-name  !!
exp2CSV(UniBib_baseStatsList, 'UniBib_BasicStats.csv')
print 'The findings were saved as a CSV file to your cwd as UniBib_BasicStats_[current datestamp].csv.'


27 libraries were queried.
The findings were saved as a CSV file to your cwd as UniBib_BasicStats_[current datestamp].csv.

3. Public Libraries


In [8]:
# 1: get the list of screennames
# ==> insert csv-name  !!
OeBib_libList = getLoSN('OeBibTwitter.csv')
print len(OeBib_libList), 'libraries were queried.'

# 2: get the account information for each screenname
OeBib_accountInfoList = AccountInfo(OeBib_libList)

# 3: get some basic stats and write them to a list of dictionaries
OeBib_baseStatsList = baseStats(OeBib_accountInfoList)

# 4: save this LoD as a csv to the cwd
# ==> insert csv-name  !!
exp2CSV(OeBib_baseStatsList, 'OeBib_BasicStats.csv')
print 'The findings were saved as a CSV file to your cwd as OeBib_BasicStats_[current datestamp].csv.'


21 libraries were queried.
The findings were saved as a CSV file to your cwd as OeBib_BasicStats_[current datestamp].csv.

Report

National Libraries


In [9]:
NatBib_median = medianOfTPD(NatBib_baseStatsList)

NatBib_dateSortList = sortingDate(NatBib_baseStatsList)

NatBib_tweetSortList = sortingTweets(NatBib_baseStatsList)

#--------

print 'There are', len(NatBib_libList), 'libraries in this category.'
print
print 'Taking the median, on average these libraries send about', NatBib_median, 'Tweets per day.'
print
print 'Oldest account:', NatBib_dateSortList[0]['screen_name'], 'with', NatBib_dateSortList[0]['tweets_per_day'], 'Tweets per day.' 
print 'Latest account:', NatBib_dateSortList[-1]['screen_name'], 'with', NatBib_dateSortList[-1]['tweets_per_day'], 'Tweets per day.' 
print
getInactiveAccounts(NatBib_baseStatsList)
print
print 'Lousiest Tweeter:', NatBib_tweetSortList[0]['screen_name'], 'with', NatBib_tweetSortList[0]['statuses_count'], 'Tweets.' 
print 'SocialMedia Addict:', NatBib_tweetSortList[-1]['screen_name'], 'with', NatBib_tweetSortList[-1]['statuses_count'], 'Tweets.' 
print
printSummary(NatBib_dateSortList)


There are 3 libraries in this category.

Taking the median, on average these libraries send about 0.7 Tweets per day.

Oldest account: bsb_muenchen with 0.7 Tweets per day.
Latest account: dnb_aktuelles with 0.75 Tweets per day.

There is no inactive library in this group. (I.e. all libraries have tweeted in the last 100 days.)

Lousiest Tweeter: dnb_aktuelles with 464 Tweets.
SocialMedia Addict: bsb_muenchen with 1256 Tweets.

münchen : bsb_muenchen = UserID: 39468408
--> Followers: 1727 ; Following: 95 ; Tweets: 1256
--> Tweets since: May 2009 = 1791 days ; Tweets per day: 0.7

berlin : sbb_news = UserID: 47622879
--> Followers: 1313 ; Following: 0 ; Tweets: 844
--> Tweets since: Jun 2009 = 1756 days ; Tweets per day: 0.48

frankfurt : dnb_aktuelles = UserID: 714103813
--> Followers: 558 ; Following: 85 ; Tweets: 464
--> Tweets since: Jul 2012 = 622 days ; Tweets per day: 0.75

University Libraries


In [10]:
UniBib_median = medianOfTPD(UniBib_baseStatsList)

UniBib_dateSortList = sortingDate(UniBib_baseStatsList)

UniBib_tweetSortList = sortingTweets(UniBib_baseStatsList)


#--------

print 'There are', len(UniBib_libList), 'libraries in this category.'
print
print 'Taking the median, on average these libraries send about', UniBib_median, 'Tweets per day.'
print
print 'Oldest account:', UniBib_dateSortList[0]['screen_name'], 'with', UniBib_dateSortList[0]['tweets_per_day'], 'Tweets per day.' 
print 'Latest account:', UniBib_dateSortList[-1]['screen_name'], 'with', UniBib_dateSortList[-1]['tweets_per_day'], 'Tweets per day.' 
print
getInactiveAccounts(UniBib_baseStatsList)
print
print 'Lousiest Tweeter:', UniBib_tweetSortList[0]['screen_name'], 'with', UniBib_tweetSortList[0]['statuses_count'], 'Tweets.' 
print 'SocialMedia Addict:', UniBib_tweetSortList[-1]['screen_name'], 'with', UniBib_tweetSortList[-1]['statuses_count'], 'Tweets.' 
print
printSummary(UniBib_dateSortList)


There are 27 libraries in this category.

Taking the median, on average these libraries send about 0.41 Tweets per day.

Oldest account: elibbremen with 1.13 Tweets per day.
Latest account: kizuulm with 0.13 Tweets per day.

ub_oldenburg, ubbayreuth_info, zbsport haven't tweeted in the last 100 days. These libraries can be considered inactive on Twitter.

Lousiest Tweeter: ubbayreuth_info with 23 Tweets.
SocialMedia Addict: stabihh with 5546 Tweets.

bremen : elibbremen = UserID: 20671215
--> Followers: 1079 ; Following: 755 ; Tweets: 2129
--> Tweets since: Feb 2009 = 1880 days ; Tweets per day: 1.13

hannover : tibub = UserID: 22807586
--> Followers: 1571 ; Following: 1025 ; Tweets: 2143
--> Tweets since: Mar 2009 = 1860 days ; Tweets per day: 1.15

hamburg : tubhh = UserID: 25095207
--> Followers: 653 ; Following: 30 ; Tweets: 1022
--> Tweets since: Mar 2009 = 1846 days ; Tweets per day: 0.55

köln : zbsport = UserID: 25307940
--> Followers: 503 ; Following: 2 ; Tweets: 58
--> Tweets since: Mar 2009 = 1845 days ; Tweets per day: 0.03

bayreuth : ubbayreuth_info = UserID: 25879034
--> Followers: 354 ; Following: 0 ; Tweets: 23
--> Tweets since: Mar 2009 = 1842 days ; Tweets per day: 0.01

dortmund : unibib = UserID: 38386894
--> Followers: 1067 ; Following: 12 ; Tweets: 358
--> Tweets since: May 2009 = 1797 days ; Tweets per day: 0.2

oldenburg : ub_oldenburg = UserID: 39998615
--> Followers: 164 ; Following: 27 ; Tweets: 50
--> Tweets since: May 2009 = 1789 days ; Tweets per day: 0.03

dresden : slubdresden = UserID: 40217644
--> Followers: 3817 ; Following: 826 ; Tweets: 2254
--> Tweets since: May 2009 = 1788 days ; Tweets per day: 1.26

bochum : ubbochum = UserID: 41152402
--> Followers: 1070 ; Following: 218 ; Tweets: 1039
--> Tweets since: May 2009 = 1784 days ; Tweets per day: 0.58

berlin : ubhumboldtuni = UserID: 47903084
--> Followers: 488 ; Following: 25 ; Tweets: 603
--> Tweets since: Jun 2009 = 1755 days ; Tweets per day: 0.34

hamburg : stabihh = UserID: 52349626
--> Followers: 1730 ; Following: 250 ; Tweets: 5546
--> Tweets since: Jun 2009 = 1742 days ; Tweets per day: 3.18

göttingen : subugoe = UserID: 67543866
--> Followers: 182 ; Following: 48 ; Tweets: 243
--> Tweets since: Aug 2009 = 1691 days ; Tweets per day: 0.14

hamburg : hsubib = UserID: 74455176
--> Followers: 699 ; Following: 166 ; Tweets: 1374
--> Tweets since: Sep 2009 = 1665 days ; Tweets per day: 0.83

leipzig : ubleipzig = UserID: 76315366
--> Followers: 1163 ; Following: 232 ; Tweets: 632
--> Tweets since: Sep 2009 = 1658 days ; Tweets per day: 0.38

duisburg : ubdue = UserID: 86322871
--> Followers: 797 ; Following: 34 ; Tweets: 887
--> Tweets since: Oct 2009 = 1620 days ; Tweets per day: 0.55

regensburg : ubreg = UserID: 87673073
--> Followers: 750 ; Following: 254 ; Tweets: 1274
--> Tweets since: Nov 2009 = 1614 days ; Tweets per day: 0.79

berlin : ub_tu_berlin = UserID: 105141688
--> Followers: 879 ; Following: 377 ; Tweets: 740
--> Tweets since: Jan 2010 = 1543 days ; Tweets per day: 0.48

kassel : ubkassel = UserID: 140830161
--> Followers: 172 ; Following: 136 ; Tweets: 258
--> Tweets since: May 2010 = 1432 days ; Tweets per day: 0.18

würzburg : ub_wue = UserID: 164600181
--> Followers: 220 ; Following: 17 ; Tweets: 406
--> Tweets since: Jul 2010 = 1369 days ; Tweets per day: 0.3

bonn : ulbbonn = UserID: 166924284
--> Followers: 411 ; Following: 36 ; Tweets: 596
--> Tweets since: Jul 2010 = 1362 days ; Tweets per day: 0.44

braunschweig : unibib_bs = UserID: 190994679
--> Followers: 123 ; Following: 1 ; Tweets: 456
--> Tweets since: Sep 2010 = 1300 days ; Tweets per day: 0.35

erlangen : ub_fau = UserID: 196916415
--> Followers: 431 ; Following: 24 ; Tweets: 528
--> Tweets since: Sep 2010 = 1286 days ; Tweets per day: 0.41

marburg : unibib_mr = UserID: 418611112
--> Followers: 307 ; Following: 49 ; Tweets: 257
--> Tweets since: Nov 2011 = 867 days ; Tweets per day: 0.3

bielefeld : ub_bi = UserID: 508565664
--> Followers: 274 ; Following: 239 ; Tweets: 812
--> Tweets since: Feb 2012 = 768 days ; Tweets per day: 1.06

mainz : ubmainz = UserID: 602631712
--> Followers: 654 ; Following: 1068 ; Tweets: 2343
--> Tweets since: Jun 2012 = 668 days ; Tweets per day: 3.51

karlsruhe : kitbibliothek = UserID: 607258639
--> Followers: 269 ; Following: 20 ; Tweets: 242
--> Tweets since: Jun 2012 = 663 days ; Tweets per day: 0.37

ulm : kizuulm = UserID: 1222219579
--> Followers: 143 ; Following: 64 ; Tweets: 54
--> Tweets since: Feb 2013 = 405 days ; Tweets per day: 0.13

Public Libraries


In [11]:
OeBib_median = medianOfTPD(OeBib_baseStatsList)

OeBib_dateSortList = sortingDate(OeBib_baseStatsList)

OeBib_tweetSortList = sortingTweets(OeBib_baseStatsList)


#--------

print 'There are', len(OeBib_libList), 'libraries in this category.'
print
print 'Taking the median, on average these libraries send about', OeBib_median, 'Tweets per day.'
print
print 'Oldest account:', OeBib_dateSortList[0]['screen_name'], 'with', OeBib_dateSortList[0]['tweets_per_day'], 'Tweets per day.' 
print 'Latest account:', OeBib_dateSortList[-1]['screen_name'], 'with', OeBib_dateSortList[-1]['tweets_per_day'], 'Tweets per day.' 
print
getInactiveAccounts(OeBib_baseStatsList)
print
print 'Lousiest Tweeter:', OeBib_tweetSortList[0]['screen_name'], 'with', OeBib_tweetSortList[0]['statuses_count'], 'Tweets.' 
print 'SocialMedia Addict:', OeBib_tweetSortList[-1]['screen_name'], 'with', OeBib_tweetSortList[-1]['statuses_count'], 'Tweets.' 
print
printSummary(OeBib_dateSortList)


There are 21 libraries in this category.

Taking the median, on average these libraries send about 1.09 Tweets per day.

Oldest account: stabuewuerzburg with 2.46 Tweets per day.
Latest account: bibliothek_wit with 0.61 Tweets per day.

buecherei_ms hasn't tweeted in the last 100 days. This library can be considered inactive on Twitter.

Lousiest Tweeter: buecherei_ms with 95 Tweets.
SocialMedia Addict: stbibkoeln with 9609 Tweets.

würzburg : stabuewuerzburg = UserID: 15178997
--> Followers: 413 ; Following: 194 ; Tweets: 5208
--> Tweets since: Jun 2008 = 2117 days ; Tweets per day: 2.46

erlangen : stabi_erlangen = UserID: 21177090
--> Followers: 988 ; Following: 497 ; Tweets: 6750
--> Tweets since: Feb 2009 = 1875 days ; Tweets per day: 3.6

berlin : stadtbibliothek = UserID: 32955209
--> Followers: 283 ; Following: 17 ; Tweets: 420
--> Tweets since: Apr 2009 = 1815 days ; Tweets per day: 0.23

solingen : stabiso = UserID: 40726549
--> Followers: 327 ; Following: 28 ; Tweets: 2227
--> Tweets since: May 2009 = 1786 days ; Tweets per day: 1.25

bremen : stabi_bremen = UserID: 46082445
--> Followers: 981 ; Following: 32 ; Tweets: 242
--> Tweets since: Jun 2009 = 1762 days ; Tweets per day: 0.14

hamburg : hoeb4u = UserID: 47645067
--> Followers: 210 ; Following: 131 ; Tweets: 5021
--> Tweets since: Jun 2009 = 1756 days ; Tweets per day: 2.86

chemnitz : sbchemnitz = UserID: 63668003
--> Followers: 1103 ; Following: 1021 ; Tweets: 1122
--> Tweets since: Aug 2009 = 1705 days ; Tweets per day: 0.66

göttingen : stabigoe = UserID: 67596714
--> Followers: 333 ; Following: 91 ; Tweets: 2301
--> Tweets since: Aug 2009 = 1690 days ; Tweets per day: 1.36

freiburg : stabifr = UserID: 69244340
--> Followers: 623 ; Following: 394 ; Tweets: 2058
--> Tweets since: Aug 2009 = 1684 days ; Tweets per day: 1.22

neuss : stbneuss = UserID: 88621648
--> Followers: 397 ; Following: 585 ; Tweets: 1575
--> Tweets since: Nov 2009 = 1610 days ; Tweets per day: 0.98

krefeld : mediothek = UserID: 107376250
--> Followers: 668 ; Following: 850 ; Tweets: 2731
--> Tweets since: Jan 2010 = 1536 days ; Tweets per day: 1.78

gütersloh : stabiguetersloh = UserID: 118645887
--> Followers: 445 ; Following: 169 ; Tweets: 550
--> Tweets since: Mar 2010 = 1498 days ; Tweets per day: 0.37

köln : stbibkoeln = UserID: 130538657
--> Followers: 2120 ; Following: 1066 ; Tweets: 9609
--> Tweets since: Apr 2010 = 1461 days ; Tweets per day: 6.58

mannheim : stabi_mannheim = UserID: 139636927
--> Followers: 471 ; Following: 265 ; Tweets: 841
--> Tweets since: May 2010 = 1436 days ; Tweets per day: 0.59

bielefeld : stb_bielefeld = UserID: 150604211
--> Followers: 358 ; Following: 15 ; Tweets: 2448
--> Tweets since: Jun 2010 = 1406 days ; Tweets per day: 1.74

essen : stbessen = UserID: 180672165
--> Followers: 357 ; Following: 77 ; Tweets: 832
--> Tweets since: Aug 2010 = 1327 days ; Tweets per day: 0.63

salzgitter : stbsalzgitter = UserID: 357509606
--> Followers: 228 ; Following: 192 ; Tweets: 1024
--> Tweets since: Aug 2011 = 963 days ; Tweets per day: 1.06

münster : buecherei_ms = UserID: 377233907
--> Followers: 108 ; Following: 3 ; Tweets: 95
--> Tweets since: Sep 2011 = 930 days ; Tweets per day: 0.1

mönchengladbach : stadtbibmg = UserID: 417924693
--> Followers: 320 ; Following: 703 ; Tweets: 942
--> Tweets since: Nov 2011 = 868 days ; Tweets per day: 1.09

düsseldorf : stadtbueduedorf = UserID: 934766042
--> Followers: 433 ; Following: 921 ; Tweets: 669
--> Tweets since: Nov 2012 = 515 days ; Tweets per day: 1.3

witten : bibliothek_wit = UserID: 1617127052
--> Followers: 51 ; Following: 50 ; Tweets: 158
--> Tweets since: Jul 2013 = 258 days ; Tweets per day: 0.61

